Density estimation by total variation penalized likelihood driven by the sparsity ℓ1 information criterion
Authors
Sylvain Sardy, Paul Tseng
Abstract
We propose a non-linear density estimator, which is locally adaptive, like wavelet estimators, and positive everywhere, without a log- or root-transform. This estimator is based on maximizing a non-parametric log-likelihood function regularized by a total variation penalty. The smoothness is driven by a single penalty parameter, and to avoid cross-validation, we derive an information criterion based on the idea of universal penalty. The penalized log-likelihood maximization is reformulated as an ℓ1-penalized strictly convex programme whose unique solution is the density estimate. A Newton-type method cannot be applied to calculate the estimate because the ℓ1 penalty is non-differentiable. Instead, we use a dual block coordinate relaxation method that exploits the problem structure. By comparing with kernel, spline and taut string estimators in a Monte Carlo simulation, and by investigating the sensitivity to ties on two real data sets, we observe that the new estimator achieves good L1 and L2 risk for densities with sharp features, and behaves well with ties.

Sardy, S. and Tseng, P. (2010). Density estimation by total variation penalized likelihood driven by the sparsity ℓ1 information criterion. Scandinavian Journal of Statistics, 37(2), 321-337. DOI: 10.1111/j.1467-9469.2009.00672.x
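For concreteness, the estimator described in the abstract can be written schematically as the following penalized likelihood problem; the notation (X_1, ..., X_n for the sample, λ for the single penalty parameter, TV(·) for total variation) is chosen here for illustration and is not taken verbatim from the paper:

\[
  \hat{f} \;=\; \arg\max_{f \ge 0,\ \int f = 1}
  \left\{ \sum_{i=1}^{n} \log f(X_i) \;-\; \lambda\,\mathrm{TV}(f) \right\},
\]

where the smoothness of \(\hat{f}\) is governed entirely by \(\lambda\), which the paper selects with the sparsity \(\ell_1\) information criterion rather than by cross-validation.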
Similar resources
Density estimation by total variation penalized likelihood driven by the sparsity ℓ1 information criterion
We propose a density estimator based on penalized likelihood and total variation. Driven by a single smoothing parameter, the nonlinear estimator has the properties of being locally adaptive and positive everywhere without a log- or root-transform. For the fast selection of the smoothing parameter we employ the sparsity ℓ1 information criterion. Furthermore the estimated density has the advantage...
Copula Density Estimation by Total Variation Penalized Likelihood with Linear Equality Constraints
A copula density is the joint probability density function (PDF) of a random vector with uniform marginals. An approach to bivariate copula density estimation is introduced that is based on a maximum penalized likelihood estimation (MPLE) with a total variation (TV) penalty term. The marginal unity and symmetry constraints for copula density are enforced by linear equality constraints. The TV-M...
Copula Density Estimation by Total Variation Penalized Likelihood
A copula density is the joint probability density function (PDF) of a random vector with uniform marginals. An approach to bivariate copula density estimation is introduced that is based on a maximum penalized likelihood estimation (MPLE) with a total variation (TV) penalty term. The marginal unity and symmetry constraints for copula density are enforced by linear equality constraints. The TV-M...
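Schematically, the TV-penalized copula estimator described in the two entries above solves a constrained maximum penalized likelihood problem of the following form; the symbols (c for the copula density, (U_i, V_i) for the uniform-margin observations, λ for the TV penalty parameter) are illustrative and not taken verbatim from those papers:

\[
  \hat{c} \;=\; \arg\max_{c \ge 0}
  \left\{ \sum_{i=1}^{n} \log c(U_i, V_i) \;-\; \lambda\,\mathrm{TV}(c) \right\}
\]

subject to the linear equality constraints encoding marginal unity and symmetry,

\[
  \int_0^1 c(u, v)\,dv = 1 \ \text{for all } u, \qquad
  \int_0^1 c(u, v)\,du = 1 \ \text{for all } v, \qquad
  c(u, v) = c(v, u).
\]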
MDL Procedures with ℓ1 Penalty and their Statistical Risk (updated August 15, 2008)
We review recently developed theory for the Minimum Description Length principle, penalized likelihood and its statistical risk. An information theoretic condition on a penalty pen(f) yields the conclusion that the optimizer of the penalized log likelihood criterion log 1/likelihood(f) + pen(f) has risk not more than the index of resolvability, corresponding to the accuracy of the optimizer of ...
MDL Procedures with ℓ1 Penalty and their Statistical Risk
We review recently developed theory for the Minimum Description Length principle, penalized likelihood and its statistical risk. An information theoretic condition on a penalty pen(f) yields the conclusion that the optimizer of the penalized log likelihood criterion log 1/likelihood(f) + pen(f) has risk not more than the index of resolvability, corresponding to the accuracy of the optimizer of ...
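For reference, the penalized criterion quoted in the two MDL entries above, together with the index of resolvability that bounds its statistical risk, is usually written as follows; this is the standard formulation from the MDL literature, and the exact normalization used in the cited work may differ:

\[
  \hat{f} \;=\; \arg\min_{f \in \mathcal{F}}
  \left\{ \log\frac{1}{\mathrm{likelihood}(f)} + \mathrm{pen}(f) \right\},
  \qquad
  R_n(f^{\ast}) \;=\; \min_{f \in \mathcal{F}}
  \left\{ D\!\left(f^{\ast} \,\middle\|\, f\right) + \frac{\mathrm{pen}(f)}{n} \right\},
\]

where \(f^{\ast}\) is the true density and \(D(\cdot\,\|\,\cdot)\) the Kullback–Leibler divergence; the information-theoretic condition on \(\mathrm{pen}(f)\) guarantees that the risk of \(\hat{f}\) is at most the index of resolvability \(R_n(f^{\ast})\).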